- Path: news.ov.com!news
- From: glenn@ov.com (Fletcher.Glenn@ov.com)
- Newsgroups: comp.lang.c
- Subject: Re: Sorting huge files
- Date: 2 Jan 1996 21:06:58 GMT
- Organization: OpenVision
- Message-ID: <4cc6pi$dbk@spanky.pls.ov.com>
- References: <4c483b$ckd@longwood.cs.ucf.edu>
- Reply-To: glenn@ov.com
- NNTP-Posting-Host: foghorn.pls.ov.com
-
- In article <4c483b$ckd@longwood.cs.ucf.edu>, schnitzi@longwood.cs.ucf.edu (Mark Schnitzius) writes:
- >glenn@ov.com (Fletcher.Glenn@ov.com) writes:
- >
- >>In article 317@news.dx.net, Todd Swanson <tswan@sinnfree.sinnfree.org> writes:
- >>>I'm having trouble sorting large files. Smaller files are no problem: I
- >>>can read the fields into arrays, then sort the pointers to the arrays
- >>><simple>. However, larger files are a problem because I can't store the
- >>>entire file in an array. Does anyone have some source, or could someone
- >>>explain an algorithm to help me? I was thinking of using memory as a
- >>>buffer and the disk as swap space; a "virtual sort", perhaps? I have
- >>>been using bubble-sort logic for this virtual sort, but I'm not sure
- >>>how to sort the file in chunks yet still manage to sort the entire file.
- >
- >>Consider the following:
- >
- >>1. Read in a chunk of the file, and sort that chunk. Write the sorted
- >> chunk to another file.
- >
- >[...etc...]
- >
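- A minimal sketch of the chunk-and-merge approach quoted above (a
- classic external merge sort), assuming newline-terminated lines no
- longer than MAXLINE; CHUNK_LINES, the run-file naming, and the
- cascade of two-way merges are illustrative choices, not from the
- original post:
-
- #include <stdio.h>
- #include <stdlib.h>
- #include <string.h>
-
- #define MAXLINE     1024        /* longest line handled   */
- #define CHUNK_LINES 10000       /* lines sorted per chunk */
-
- static int cmp(const void *a, const void *b)
- {
-     return strcmp(*(char *const *)a, *(char *const *)b);
- }
-
- /* Phase 1: split the input into sorted runs, one temp file per
-    run.  Returns the number of runs written. */
- static int make_runs(FILE *in)
- {
-     static char *lines[CHUNK_LINES];
-     char buf[MAXLINE], name[32];
-     FILE *out;
-     int n, i, nruns = 0;
-
-     do {
-         for (n = 0; n < CHUNK_LINES && fgets(buf, sizeof buf, in); n++) {
-             lines[n] = malloc(strlen(buf) + 1);
-             strcpy(lines[n], buf);
-         }
-         if (n == 0)
-             break;
-         qsort(lines, n, sizeof lines[0], cmp);
-         sprintf(name, "run%d.tmp", nruns++);
-         out = fopen(name, "w");
-         for (i = 0; i < n; i++) {
-             fputs(lines[i], out);
-             free(lines[i]);
-         }
-         fclose(out);
-     } while (n == CHUNK_LINES);
-     return nruns;
- }
-
- /* Phase 2 (the "[...etc...]" above): merge two sorted runs. */
- static void merge2(FILE *a, FILE *b, FILE *out)
- {
-     char la[MAXLINE], lb[MAXLINE];
-     char *pa, *pb;
-
-     pa = fgets(la, sizeof la, a);
-     pb = fgets(lb, sizeof lb, b);
-     while (pa && pb) {
-         if (strcmp(la, lb) <= 0) {
-             fputs(la, out);
-             pa = fgets(la, sizeof la, a);
-         } else {
-             fputs(lb, out);
-             pb = fgets(lb, sizeof lb, b);
-         }
-     }
-     while (pa) { fputs(la, out); pa = fgets(la, sizeof la, a); }
-     while (pb) { fputs(lb, out); pb = fgets(lb, sizeof lb, b); }
- }
-
- int main(int argc, char **argv)
- {
-     char name[32];
-     FILE *a, *b, *out;
-     int i, nruns;
-
-     a = fopen(argv[1], "r");
-     nruns = make_runs(a);
-     fclose(a);
-
-     /* cascade merge: fold each run into the growing result */
-     rename("run0.tmp", "sorted.tmp");
-     for (i = 1; i < nruns; i++) {
-         sprintf(name, "run%d.tmp", i);
-         a = fopen("sorted.tmp", "r");
-         b = fopen(name, "r");
-         out = fopen("merge.tmp", "w");
-         merge2(a, b, out);
-         fclose(a); fclose(b); fclose(out);
-         rename("merge.tmp", "sorted.tmp");
-         remove(name);
-     }
-     return 0;
- }
-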
- >Better still, just use fseek and ftell to move about in the file.
- >Make an array of longs, one entry per line, saving the result of
- >ftell before reading each line. Then you can use fseek to go back
- >and read a line in again before making a comparison. Not terribly
- >efficient, but it gets the job done.
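-
- A minimal sketch of this offset-index idea, assuming the line count
- fits the in-memory offset array (MAXLINES here is illustrative).
- The offsets want to be longs, since ftell() returns long. The
- qsort() comparator seeks and re-reads both lines on every
- comparison, which is what makes the method slow but memory-cheap;
- writing the lines back out in index order is the step that finally
- produces the sorted file, which bears on the question below:
-
- #include <stdio.h>
- #include <stdlib.h>
- #include <string.h>
-
- #define MAXLINE  1024
- #define MAXLINES 100000L
-
- static FILE *fp;                 /* file being sorted            */
- static long  off[MAXLINES];      /* ftell() of each line's start */
-
- static void line_at(long pos, char *buf)
- {
-     fseek(fp, pos, SEEK_SET);
-     fgets(buf, MAXLINE, fp);
- }
-
- static int cmp(const void *a, const void *b)
- {
-     char la[MAXLINE], lb[MAXLINE];
-
-     line_at(*(const long *)a, la);
-     line_at(*(const long *)b, lb);
-     return strcmp(la, lb);
- }
-
- int main(int argc, char **argv)
- {
-     char buf[MAXLINE];
-     long n = 0, i;
-
-     fp = fopen(argv[1], "r");
-     /* save the offset of each line before reading it */
-     while (n < MAXLINES) {
-         off[n] = ftell(fp);
-         if (!fgets(buf, sizeof buf, fp))
-             break;
-         n++;
-     }
-     /* sort the offsets by the lines they point at */
-     qsort(off, (size_t)n, sizeof off[0], cmp);
-     /* emit the lines in sorted order */
-     for (i = 0; i < n; i++) {
-         line_at(off[i], buf);
-         fputs(buf, stdout);
-     }
-     fclose(fp);
-     return 0;
- }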
- >
- >
- >_____________________________________________________________
- >mark schnitzius - - - - - - - - - - - - - schnitzi@mentos.com
- >- - - -<a href="http://east.isx.com/~schnitzi/">me</a>- - - -
-
-
- Tell me, exactly how does your method result in a sorted file?
-
- Fletcher.Glenn@ov.com
-